Deep Gaussian Process Regression (DGPR)

Author

  • Haibin Yu
Abstract

A Gaussian Process Regression (GPR) model is equivalent to an infinitely wide neural network with a single hidden layer, and similarly a deep Gaussian process (DGP) is equivalent to a multi-layer neural network with multiple infinitely wide hidden layers [Neal, 1995]. DGPs employ a hierarchical structure of GP mappings and are therefore arguably more flexible, have a greater capacity to generalize, and can provide better predictive performance [Damianou, 2015]. This naturally raises the question: why go deep, and what do we gain from depth? It has been argued that the addition of non-linear hidden layers can potentially overcome practical limitations of shallow GPs [Bui et al., 2016]. What are these limitations exactly? A GPR model is fully specified by a mean function E[·] and a covariance function cov[·, ·]. Conventionally the mean function is set to 0, so a GPR model is in effect fully specified by its covariance function, also known as the kernel. Let us briefly examine the priors on functions encoded by some commonly used kernels.
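To make the last point concrete, here is a minimal sketch in plain NumPy (the function and variable names are mine, not the paper's) that draws sample functions from a zero-mean GP prior under a squared-exponential kernel, then feeds one draw through a second GP prior to mimic the two-layer composition that defines a DGP. Swapping the kernel changes which functions the prior favours.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel k(x, x') = s^2 exp(-(x - x')^2 / (2 l^2))."""
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

def sample_gp_prior(x, n_samples=1, lengthscale=1.0):
    """Draw functions f ~ N(0, K) at the inputs x via a Cholesky factor of K."""
    K = rbf_kernel(x, x, lengthscale=lengthscale)
    L = np.linalg.cholesky(K + 1e-6 * np.eye(len(x)))  # jitter for numerical stability
    return L @ rng.normal(size=(len(x), n_samples))

x = np.linspace(-5, 5, 200)
shallow = sample_gp_prior(x, n_samples=3)  # three draws from a shallow GP prior

# Two-layer "deep" draw: the output of one GP becomes the input of the next,
# i.e. f2(f1(x)) -- the hierarchical composition of GP mappings described above.
f1 = sample_gp_prior(x)[:, 0]
f2 = sample_gp_prior(f1)[:, 0]
```

Note that the composed draw f2 is no longer Gaussian in general, which is one concrete sense in which depth buys flexibility beyond what any single kernel encodes.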


Related resources

Corrigendum to "Solving Dynamic Traveling Salesman Problem Using Dynamic Gaussian Process Regression"

This paper solves the dynamic traveling salesman problem (DTSP) using the dynamic Gaussian Process Regression (DGPR) method. The problem of the varying-correlation tour is alleviated by a nonstationary covariance function interleaved with DGPR to generate a predictive distribution for the DTSP tour. This approach is combined with the Nearest Neighbor (NN) method and iterated local search to track dynami...


Scalable Gaussian Process Regression Using Deep Neural Networks

We propose a scalable Gaussian process model for regression by applying a deep neural network as the feature-mapping function. We first pre-train the deep neural network with a stacked denoising auto-encoder in an unsupervised way. Then, we perform a Bayesian linear regression on the top layer of the pre-trained deep network. The resulting model, Deep-Neural-Network-based Gaussian Process (DNN-...
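The two-step recipe described above (unsupervised pre-training, then Bayesian linear regression on the top layer) is easy to sketch once the network is treated as a fixed feature map. In the sketch below, `phi` stands in for the pre-trained network (random weights here, purely for illustration) and the closed-form posterior follows the standard Bayesian linear regression equations; none of the names come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(1, 50))  # stand-in weights; in the paper these come from pre-training

def phi(X):
    """Placeholder for the pre-trained network's top-layer feature map."""
    return np.tanh(X @ W1)

def fit_bayesian_linear(Phi, y, alpha=1.0, noise_var=0.1):
    """Posterior over w for y = Phi @ w + eps, with prior w ~ N(0, (1/alpha) I)."""
    A = alpha * np.eye(Phi.shape[1]) + (Phi.T @ Phi) / noise_var
    w_mean = np.linalg.solve(A, Phi.T @ y) / noise_var
    w_cov = np.linalg.inv(A)
    return w_mean, w_cov

# Toy usage: regress y = sin(x) from 30 noisy points.
X = rng.uniform(-3, 3, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=30)
w_mean, w_cov = fit_bayesian_linear(phi(X), y)
Phi_star = phi(np.linspace(-3, 3, 100)[:, None])
pred_mean = Phi_star @ w_mean
pred_var = np.sum(Phi_star @ w_cov * Phi_star, axis=1) + 0.1  # + observation noise
```

Because the posterior and predictive distribution are in closed form, the cost is linear in the number of data points once the features are computed, which is what makes this construction scalable.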


Using Deep Belief Nets to Learn Covariance Kernels for Gaussian Processes

We show how to use unlabeled data and a deep belief net (DBN) to learn a good covariance kernel for a Gaussian process. We first learn a deep generative model of the unlabeled data using the fast, greedy algorithm introduced by [7]. If the data is high-dimensional and highly-structured, a Gaussian kernel applied to the top layer of features in the DBN works much better than a similar kernel app...
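To illustrate the idea of a learned covariance kernel, the sketch below compares a Gaussian kernel on raw inputs with the same kernel applied to a feature map standing in for the DBN's top layer. The map `g` here is randomly initialised and purely hypothetical; in the paper it is learned greedily, layer by layer, from unlabeled data.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(A, B, lengthscale=1.0):
    """Pairwise k(a, b) = exp(-||a - b||^2 / (2 l^2)) for rows of A and B."""
    d2 = (np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-0.5 * np.maximum(d2, 0.0) / lengthscale**2)

# Stand-in for the DBN's top-layer representation g(x); the real map is
# learned from unlabeled data, not drawn at random.
W = rng.normal(size=(10, 20))
def g(X):
    return np.maximum(X @ W, 0.0)

X = rng.normal(size=(5, 10))
K_raw = gaussian_kernel(X, X)           # kernel on raw inputs
K_deep = gaussian_kernel(g(X), g(X))    # kernel on learned features
```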


Deep Gaussian Processes for Regression using Approximate Expectation Propagation

Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (GPs) and are formally equivalent to neural networks with multiple, infinitely wide hidden layers. DGPs are nonparametric probabilistic models and as such are arguably more flexible, have a greater capacity to generalise, and provide better calibrated uncertainty estimates than alternative deep mod...


Practical Learning of Deep Gaussian Processes via Random Fourier Features

The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic approach to flexibly quantify uncertainty and carry out model selection in various learning scenarios. In this work, we introduce a novel formulation of DGPs based on random Fourier features that we train using stochastic variational inference. Our proposal yields an efficient way of tra...
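Random Fourier features replace a kernel with an explicit finite-dimensional feature map whose inner product approximates it, which is what lets a DGP layer be written out and trained with stochastic variational inference. Below is a minimal single-layer sketch using the standard Rahimi-Recht construction for an RBF kernel, not the paper's exact parameterisation.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, n_features=2000, lengthscale=1.0):
    """Map z(x) such that z(x) @ z(x') ~= exp(-||x - x'||^2 / (2 l^2))."""
    d = X.shape[1]
    Omega = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)                 # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ Omega + b)

X = rng.normal(size=(8, 3))
Z = random_fourier_features(X)
K_approx = Z @ Z.T  # approximates the exact RBF Gram matrix on X
```

Stacking one such map per GP layer turns the composition into a finite network whose parameters are amenable to stochastic variational training, as the abstract describes.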




Publication date: 2017